A No-Go Theorem for One-Layer Feedforward Networks
Authors
Abstract
It is often hypothesized that a crucial role for recurrent connections in the brain is to constrain the set of possible response patterns, thereby shaping the neural code. This implies the existence of neural codes that cannot arise solely from feedforward processing. We set out to find such codes in the context of one-layer feedforward networks and identified a large class of combinatorial codes that indeed cannot be shaped by the feedforward architecture alone. However, these codes are difficult to distinguish from codes that share the same sets of maximal activity patterns in the presence of subtractive noise. When we coarsened the notion of combinatorial neural code to keep track of only maximal patterns, we found the surprising result that all such codes can in fact be realized by one-layer feedforward networks. This suggests that recurrent or many-layer feedforward architectures are not necessary for shaping the (coarse) combinatorial features of neural codes. In particular, it is not possible to infer a computational role for recurrent connections from the combinatorics of neural response patterns alone. Our proofs use mathematical tools from classical combinatorial topology, such as the nerve lemma and the existence of an inverse nerve. An unexpected corollary of our main result is that any prescribed (finite) homotopy type can be realized by a subset of the form [Formula: see text], where P is a polyhedron.
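As a point of reference, a common formalization of the setup (a sketch under standard assumptions; the notation here is ours, not necessarily the paper's): a one-layer feedforward network with weight matrix W, thresholds θ, and a nonlinearity φ vanishing on nonpositive arguments assigns an output pattern to each nonnegative input x, and its combinatorial code is the set of supports of those outputs:

$$\mathcal{C}(W,\theta) \;=\; \Big\{\, \mathrm{supp}\big(y(x)\big) \;:\; x \in \mathbb{R}^{m}_{\ge 0} \,\Big\}, \qquad y_i(x) \;=\; \varphi\Big(\sum_{j=1}^{m} W_{ij}\, x_j - \theta_i\Big),$$

where supp(y) = { i : y_i > 0 }. The no-go question is then which subsets of 2^{[n]} arise as C(W, θ) for some choice of W and θ.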
Similar Resources
Multilayer feedforward networks are universal approximators
This paper rigorously establishes that standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel measurable function from one finite dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available. In this sense, multilayer feedforward networks are a c...
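Concretely, the approximating networks in that result compute sums of the form below (a standard restatement, assuming a squashing function σ, i.e., nondecreasing with limits 0 and 1):

$$f(x) \;\approx\; \sum_{j=1}^{q} \beta_j \, \sigma\!\big(a_j^{\top} x + b_j\big),$$

and the theorem asserts that, for large enough q, such sums approximate any Borel measurable function to any desired degree of accuracy in the appropriate metric.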
Learning for VMM + WTA Embedded Classifiers
The authors present training and feedforward computation for a single layer of a VMM+WTA classifier. The experimental demonstration of the one-layer universal approximator encourages the use of one-layer networks for embedded low-power classification. The results enable correct classification of each novel acoustic signal (generator, idle car, and idle truck). The classification structure req...
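In outline, the decision rule of such a classifier is a vector-matrix multiply followed by a winner-take-all stage (an illustrative sketch; the symbols are ours, not the authors'):

$$\hat{c}(x) \;=\; \arg\max_{i}\; (W x)_i,$$

where W is the learned weight matrix realized by the VMM and the WTA circuit selects the index of the largest analog output.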
Rate and Synchrony in Feedforward Networks of Coincidence Detectors: Analytical Solution
We provide an analytical recurrent solution for the firing rates and cross-correlations of feedforward networks with arbitrary connectivity, excitatory or inhibitory, in response to steady-state spiking input to all neurons in the first network layer. Connections can go between any two layers as long as no loops are produced. Mean firing rates and pairwise cross-correlations of all input neuron...
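The structural property that makes such a recursive solution possible can be sketched generically (an illustration of the layered structure, not the paper's actual formulas): because the connectivity contains no loops, the network is a directed acyclic graph, so the rate and correlation statistics of each layer depend only on earlier layers,

$$\big(r^{(l+1)},\, C^{(l+1)}\big) \;=\; F\big(r^{(1)},\dots,r^{(l)};\; C^{(1)},\dots,C^{(l)};\; W\big),$$

and can therefore be computed layer by layer from the input forward.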
Neural networks with a continuous squashing function in the output are universal approximators
In 1989, Hornik as well as Funahashi established that multilayer feedforward networks without the squashing function in the output layer are universal approximators. This result has often been used improperly because it has been applied to multilayer feedforward networks with the squashing function in the output layer. In this paper, we will prove that this kind of neural network is also unive...
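The networks in question compose the squashing function with the usual one-hidden-layer sum (a generic sketch of the architecture being discussed, in our notation):

$$f(x) \;\approx\; \sigma\!\left(\beta_0 + \sum_{j=1}^{q} \beta_j\, \sigma\!\big(a_j^{\top} x + b_j\big)\right),$$

so the output is confined to the range of σ, and universality is claimed for target functions taking values in that range.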
An LMS Algorithm for Training Single Layer Globally Recursive Neural Networks
Unlike feedforward neural networks (FFNN), which can act as universal function approximators, recursive neural networks have the potential to act as both universal function approximators and universal system approximators. In this paper, a globally recursive neural network least mean square (GRNNLMS) gradient descent or a real time recursive backpropagation (RTRBP) algorithm is developed for a s...
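For context, the LMS core that such algorithms generalize is the classical stochastic-gradient update (a textbook form, not the paper's GRNNLMS recursion):

$$e(n) \;=\; d(n) - w(n)^{\top} x(n), \qquad w(n+1) \;=\; w(n) + \mu\, e(n)\, x(n),$$

with step size μ; the recursive variants additionally propagate error gradients through the network's feedback connections over time.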
Journal: Neural Computation
Volume 26, Issue 11
Pages: -
Published: 2014